# Multilingual Twitter pre-training

**Arabic Xlm Xnli** (by morit) · License: MIT

Based on the XLM-RoBERTa-base model, continually pre-trained on an Arabic Twitter corpus and fine-tuned on the Arabic portion of the XNLI dataset for zero-shot text classification.

Tags: Text Classification · Transformers · Arabic
Downloads: 268 · Likes: 0
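
Zero-shot classification with an XNLI-fine-tuned model works by recasting each candidate label as a natural-language hypothesis and letting the NLI model score whether the input text entails it. The sketch below illustrates that idea; the model id `morit/arabic_xlm_xnli` and the hypothesis template are assumptions for illustration, not confirmed by this page.

```python
def build_nli_pairs(text, candidate_labels, template="This example is {}."):
    """Pair the input text (the NLI premise) with one hypothesis per
    candidate label; the entailment score for each pair becomes the
    label score in zero-shot classification."""
    return [(text, template.format(label)) for label in candidate_labels]


pairs = build_nli_pairs("Oil prices rose sharply today.", ["economy", "sports"])
# pairs[0] -> ("Oil prices rose sharply today.", "This example is economy.")

# With the transformers library, the same mechanism is wrapped in a pipeline
# (model id below is an assumption):
# from transformers import pipeline
# clf = pipeline("zero-shot-classification", model="morit/arabic_xlm_xnli")
# clf(arabic_text, candidate_labels=["اقتصاد", "رياضة"])
```

In practice the template is usually written in the model's target language (here, Arabic), since entailment is judged on the premise-hypothesis pair as a whole.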